Learning Multiple Related Tasks using Latent Independent Component Analysis

Authors

  • Jian Zhang
  • Zoubin Ghahramani
  • Yiming Yang
Abstract

We propose a probabilistic model based on Independent Component Analysis for learning multiple related tasks. In our model the task parameters are assumed to be generated from independent sources, which account for the relatedness of the tasks. We use Laplace distributions to model the hidden sources, which makes it possible to identify the hidden, independent components instead of just modeling correlations. Furthermore, our model enjoys a sparsity property which makes it both parsimonious and robust. We also propose efficient algorithms for both the empirical Bayes method and point estimation. Our experimental results on two multi-label text classification data sets show that the proposed approach is promising.
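The generative assumption in the abstract — each task's parameter vector is a linear mixture of independent, Laplace-distributed hidden sources — can be sketched as follows. This is a minimal illustration of that assumption only, not the authors' estimation algorithm; the variable names (`A` for the mixing matrix, `S` for the sources) are my own notation.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, T = 10, 3, 5  # feature dimension, number of hidden sources, number of tasks

# Shared mixing matrix: every task's parameters are a linear combination
# of the same k independent components (this is what relates the tasks).
A = rng.normal(size=(d, k))

# Laplace-distributed sources: heavier tails than a Gaussian, which gives
# identifiable independent components and a sparsity-inducing prior.
S = rng.laplace(loc=0.0, scale=1.0, size=(k, T))

# Column t of Theta is the parameter vector of task t: theta_t = A @ s_t.
Theta = A @ S
print(Theta.shape)  # (10, 5)
```

Because the Laplace density is non-Gaussian, the mixing matrix is identifiable (unlike in a purely Gaussian factor model, which only captures correlations), and its sharp peak at zero encourages sparse source activations.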


Related articles

The Effects of Task Orientation and Involvement Load on Learning Collocations

This study examined the effects of input-oriented and output-oriented tasks with different involvement load indices on Iranian EFL learners' comprehension and production of lexical collocations. To achieve this purpose, a sample of 180 intermediate-level EFL learners (both male and female) participated in the study. The participants were in six experimental groups. Each of the groups was random...


Fuzzy Local ICA for Extracting Independent Components Related to External Criteria

Independent component analysis (ICA) is an unsupervised technique for blind source separation, and the ICA algorithms using nongaussianity as the measure of mutual independence have been also used for projection pursuit or visualization of multivariate data for knowledge discovery in databases (KDD). However, in real applications, it is often the case that we fail to extract useful latent varia...


Differential learning algorithms for decorrelation and independent component analysis

Decorrelation and its higher-order generalization, independent component analysis (ICA), are fundamental and important tasks in unsupervised learning, that were studied mainly in the domain of Hebbian learning. In this paper we present a variation of the natural gradient ICA, differential ICA, where the learning relies on the concurrent change of output variables. We interpret the differential ...


Similarity Component Analysis

Measuring similarity is crucial to many learning tasks. To this end, metric learning has been the dominant paradigm. However, similarity is a richer and broader notion than what metrics entail. For example, similarity can arise from the process of aggregating the decisions of multiple latent components, where each latent component compares data in its own way by focusing on a different subset o...


Variational Learning in Nonlinear Gaussian Belief Networks

We view perceptual tasks such as vision and speech recognition as inference problems where the goal is to estimate the posterior distribution over latent variables (e.g., depth in stereo vision) given the sensory input. The recent flurry of research in independent component analysis exemplifies the importance of inferring the continuous-valued latent variables of input data. The latent variable...



Journal:

Volume   Issue

Pages  -

Publication date: 2005